Reviews: Efficient Neural Network Robustness Certification with General Activation Functions

Neural Information Processing Systems

Summary: This paper proposes CROWN, a general framework for efficiently certifying the robustness of neural networks with general activation functions. CROWN adaptively bounds a given activation function with linear and quadratic functions, so it can handle general activations including but not limited to the four popular choices: ReLU, tanh, sigmoid, and arctan. Experimental results demonstrate the effectiveness, efficiency, and flexibility of the proposed framework. Quality: We are glad to see a work that efficiently certifies non-trivial robustness for neural networks with general activation functions. It is also interesting that the proposed framework can flexibly select upper and lower bounds, which reduces the approximation error.


Efficient Neural Network Robustness Certification with General Activation Functions

Zhang, Huan, Weng, Tsui-Wei, Chen, Pin-Yu, Hsieh, Cho-Jui, Daniel, Luca

Neural Information Processing Systems

Finding the minimum distortion of adversarial examples, and thus certifying robustness of neural network classifiers, is known to be a challenging problem. Nevertheless, it has recently been shown possible to give a non-trivial certified lower bound on the minimum distortion, and some progress has been made in this direction by exploiting the piece-wise linear nature of ReLU activations. However, generic robustness certification for general activation functions remains largely unexplored. To address this issue, in this paper we introduce CROWN, a general framework to certify the robustness of neural networks with general activation functions. The novelty of our algorithm lies in bounding a given activation function with linear and quadratic functions, hence allowing it to tackle general activation functions including but not limited to the four popular choices: ReLU, tanh, sigmoid, and arctan.
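To illustrate the linear-bounding idea the abstract describes, here is a minimal sketch (not the authors' implementation) of adaptive linear bounds for a single ReLU neuron whose pre-activation is known to lie in an interval [l, u]. The function name and the particular adaptive-slope rule for the lower bound are assumptions for illustration; the secant upper bound for an unstable neuron is the standard relaxation used in this line of work.

```python
def relu_linear_bounds(l, u):
    """Given pre-activation bounds l <= x <= u, return (a_U, b_U, a_L, b_L)
    such that a_L*x + b_L <= relu(x) <= a_U*x + b_U holds on [l, u].

    Illustrative sketch, not the paper's code. The lower-bound slope is
    chosen adaptively (1 if u >= -l, else 0), which keeps the relaxation
    tight on the side of the interval where ReLU spends more of its range.
    """
    if l >= 0:
        # Neuron always active: relu(x) = x exactly on [l, u].
        return 1.0, 0.0, 1.0, 0.0
    if u <= 0:
        # Neuron always inactive: relu(x) = 0 exactly on [l, u].
        return 0.0, 0.0, 0.0, 0.0
    # Unstable neuron (l < 0 < u): the secant through (l, 0) and (u, u)
    # is a valid upper bound because relu is convex.
    a_U = u / (u - l)
    b_U = -a_U * l
    # Adaptive lower bound: any slope in [0, 1] through the origin is valid.
    a_L = 1.0 if u >= -l else 0.0
    return a_U, b_U, a_L, 0.0
```

Propagating such per-neuron linear bounds backwards through the network layers is what yields the certified lower bound on the minimum adversarial distortion.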